236 research outputs found

    Biocover - Whole landfill methane emission


    Real-time whole-genome sequencing for routine typing, surveillance, and outbreak detection of verotoxigenic Escherichia coli.

    Fast and accurate identification and typing of pathogens are essential for effective surveillance and outbreak detection. The current routine procedure is based on a variety of techniques, making the procedure laborious, time-consuming, and expensive. With whole-genome sequencing (WGS) becoming cheaper, it has huge potential in both diagnostics and routine surveillance. The aim of this study was to perform a real-time evaluation of WGS for routine typing and surveillance of verocytotoxin-producing Escherichia coli (VTEC). In Denmark, the Statens Serum Institut (SSI) routinely receives all suspected VTEC isolates. During a 7-week period in the fall of 2012, all incoming isolates were concurrently subjected to WGS using the Ion Torrent PGM. Real-time bioinformatics analysis was performed using web tools (www.genomicepidemiology.org) for species determination, multilocus sequence typing (MLST), and determination of phylogenetic relationships, and a specific VirulenceFinder for detection of E. coli virulence genes was developed as part of this study. In total, 46 suspected VTEC isolates were characterized in parallel during the study. VirulenceFinder proved successful in detecting the virulence genes included in routine typing, namely verocytotoxin 1 (vtx1), verocytotoxin 2 (vtx2), and intimin (eae), and also detected additional virulence genes. VirulenceFinder is also a robust method for assigning verocytotoxin (vtx) subtypes. A real-time clustering of isolates in agreement with the epidemiology was established from WGS, enabling discrimination between sporadic and outbreak isolates. Overall, WGS typing produced results faster and at a lower cost than the current routine. Therefore, WGS typing is a superior alternative to conventional typing strategies. This approach may also be applied to typing and surveillance of other pathogens.
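
    To make the gene-detection step concrete, the following is a minimal, hypothetical sketch of the kind of check a tool such as VirulenceFinder performs: reference virulence-gene sequences (e.g. vtx1, vtx2, eae) are searched for in the assembled contigs. Real tools use alignment (BLAST- or k-mer-based) with identity and coverage thresholds; this sketch is reduced to exact substring matching on both strands, and the file names and database contents are placeholders, not the actual VirulenceFinder database.

```python
# Hypothetical sketch of virulence-gene detection by sequence matching.
# Tools such as VirulenceFinder align contigs against a curated gene
# database; here detection is reduced to exact substring search on both
# strands, and the FASTA paths below are placeholders.

def read_fasta(path):
    """Parse a FASTA file into a {header: sequence} dict."""
    records, header, chunks = {}, None, []
    with open(path) as handle:
        for line in handle:
            line = line.strip()
            if line.startswith(">"):
                if header is not None:
                    records[header] = "".join(chunks)
                header, chunks = line[1:], []
            elif line:
                chunks.append(line.upper())
    if header is not None:
        records[header] = "".join(chunks)
    return records

def reverse_complement(seq):
    """Reverse complement for plain A/C/G/T sequences."""
    return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

def detect_genes(contigs, gene_db):
    """Return (gene, contig) pairs where a reference gene occurs verbatim."""
    hits = []
    for gene, gene_seq in gene_db.items():
        for contig, contig_seq in contigs.items():
            if gene_seq in contig_seq or reverse_complement(gene_seq) in contig_seq:
                hits.append((gene, contig))
                break
    return hits

if __name__ == "__main__":
    # Placeholder inputs: an assembly and a FASTA of virulence genes (vtx1, vtx2, eae, ...).
    contigs = read_fasta("assembly_contigs.fasta")
    gene_db = read_fasta("virulence_genes.fasta")
    for gene, contig in detect_genes(contigs, gene_db):
        print(f"{gene} detected in {contig}")
```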

    Computation in Physical Systems: A Normative Mapping Account

    The relationship between abstract formal procedures and the activities of actual physical systems has proved to be surprisingly subtle and controversial, and there are a number of competing accounts of when a physical system can properly be said to implement a mathematical formalism and hence perform a computation. I defend an account wherein computational descriptions of physical systems are high-level normative interpretations motivated by our pragmatic concerns. Furthermore, the criteria of utility and success vary according to our diverse purposes and pragmatic goals. Hence there is no independent or uniform fact of the matter, and I advance the ‘anti-realist’ conclusion that computational descriptions of physical systems are not founded upon deep ontological distinctions, but rather upon interest-relative human conventions. Hence physical computation is a ‘conventional’ rather than a ‘natural’ kind.

    Is Evolution Algorithmic?

    In Darwin’s Dangerous Idea, Daniel Dennett claims that evolution is algorithmic. On Dennett’s analysis, evolutionary processes are trivially algorithmic because he assumes that all natural processes are algorithmic. I will argue that there are more robust ways to understand algorithmic processes that make the claim that evolution is algorithmic empirical rather than conceptual. While laws of nature can be seen as compression algorithms of information about the world, it does not follow logically that they are implemented as algorithms by physical processes. For that to be true, the processes have to be part of computational systems. The basic difference between mere simulation and real computing is having the proper causal structure. I will show what kind of requirements this poses for natural evolutionary processes if they are to be computational.

    Intercomparison of detection and quantification methods for methane emissions from the natural gas distribution network in Hamburg, Germany

    In August and September 2020, three different measurement methods for quantifying methane (CH4) emissions from leaks in urban gas distribution networks were applied and compared in Hamburg, Germany: the “mobile”, “tracer release”, and “suction” methods. The mobile and tracer release methods determine emission rates to the atmosphere from measurements of CH4 mole fractions in the ambient air, and the tracer release method also includes measurement of a gaseous tracer. The suction method determines emission rates by pumping air out of the ground using soil probes that are placed above the suspected leak location. The quantitative intercomparison of the emission rates from the three methods at a small number of locations is challenging because of limitations of the different methods at different types of leak locations. The mobile method was designed to rapidly quantify the average or total emission rate of many gas leaks in a city, but it yields a large emission rate uncertainty for individual leak locations. Emission rates determined for individual leak locations with the tracer release technique are more precise because the simultaneous measurement of the tracer released at a known rate at the emission source eliminates many of the uncertainties encountered with the mobile method. Nevertheless, care must be taken to properly collocate the tracer release and the leak emission points to avoid biases in emission rate estimates. The suction method could not be completed or applied at locations with widespread subsurface CH4 accumulation or where safety measures prevented its use. While the number of gas leak locations in this study is small, we observe a correlation between leak emission rate and subsurface accumulation. Wide accumulation places leaks into a safety category that requires immediate repair, so the suction method cannot be applied to these larger leaks in routine operation. This introduces a sampling bias for the suction method in this study towards low-emission leaks, which do not require immediate repair measures. Given that this study is based on random sampling, such a sampling bias may also exist for the suction method outside of this study. While an investigation of the causal relationship between safety category and leak size is beyond the scope of this study, on average higher emission rates were observed with all three measurement-based quantification methods for leaks with higher safety priority than for leaks with lower safety concern. The leak locations where the suction method could not be applied were the biggest emitters, as confirmed by the emission rate quantifications using the mobile and tracer methods and by an engineering method based on the leak's diameter, pipeline overpressure, and the depth at which the pipeline is buried. The corresponding sampling bias for the suction technique led to a low bias in derived emission rates in this study. It is important that future studies using the suction method account for any leaks not quantifiable with this method in order to avoid biases, especially when used to inform emission inventories.
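
    For context, the tracer release technique described above conventionally derives the leak's emission rate from the known tracer release rate scaled by the ratio of the downwind CH4 and tracer mole-fraction enhancements and by the molar-mass ratio. The sketch below illustrates that standard calculation only; the choice of tracer species and all numerical values are illustrative assumptions, not measurements from this study.

```python
# Sketch of the standard tracer-ratio calculation. The tracer species
# (acetylene) and all numbers are illustrative assumptions, not data from
# the Hamburg campaign.

M_CH4 = 16.04    # molar mass of methane [g/mol]
M_C2H2 = 26.04   # molar mass of acetylene, assumed here as the tracer [g/mol]

def tracer_ratio_emission(q_tracer_kg_h, delta_ch4_ppb, delta_tracer_ppb,
                          m_gas=M_CH4, m_tracer=M_C2H2):
    """CH4 emission rate [kg/h] from the tracer-ratio method.

    q_tracer_kg_h    : known tracer release rate [kg/h]
    delta_ch4_ppb    : downwind CH4 enhancement above background [ppb]
    delta_tracer_ppb : downwind tracer enhancement above background [ppb]
    """
    return q_tracer_kg_h * (delta_ch4_ppb / delta_tracer_ppb) * (m_gas / m_tracer)

if __name__ == "__main__":
    # Example: 0.5 kg/h tracer release, 120 ppb CH4 and 40 ppb tracer enhancement.
    q_ch4 = tracer_ratio_emission(0.5, 120.0, 40.0)
    print(f"Estimated leak emission rate: {q_ch4:.2f} kg CH4/h")
```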

    Microbiological risk assessment

    Microbiological risk assessment is defined by the CODEX Alimentarius Commission as 'a scientifically based process consisting of the following steps: (i) hazard identification; (ii) hazard characterisation; (iii) exposure assessment; and (iv) risk characterisation'. It is one of the components of microbiological risk analysis, which has the overall objective of minimising food-borne risks to consumers. It is a complex discipline that continues to evolve, and its challenges and new opportunities were discussed during the breakout session 'Microbiological risk assessment' held at the EFSA 2nd Scientific Conference 'Shaping the Future of Food Safety, Together' (Milan, Italy, 14–16 October 2015). Discussions focussed on the estimation of the global burden of food-borne disease, the prioritisation of microbiological risks taking into account uncertainty, the challenges in risk assessment when dealing with viruses, the contribution of typing methods to risk assessment, and approaches to dealing with uncertainty in risk assessment in emergency situations. It was concluded that the results of the global burden of food-borne disease study provide, for the first time, a comprehensive comparison of risks due to different hazards, and this will be an important input to food safety strategies at the global, regional and national levels. Risk ranking methodologies are an important tool for priority setting, and it is important to account for underestimation of the true burden (e.g. due to reporting bias). Typing methods for microbial hazards inevitably affect risk assessment and can have an important influence on the accuracy of source attribution studies. Due to their high genetic diversity and the limitations of current diagnostic methods, it is still challenging to obtain robust evidence for food-borne outbreaks caused by viruses, and more research is needed on the use of whole genome sequencing in this area. The lessons learnt from the recent enterohaemorrhagic Escherichia coli (EHEC) outbreak in Germany include the need for more effective and timely connections within and between institutions as responses unfold.
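
    As an illustration of how the CODEX steps combine quantitatively, the sketch below chains a simple exposure assessment with an exponential dose-response model (P_ill = 1 - exp(-r * dose)), a standard form in quantitative microbiological risk assessment; all parameter values are hypothetical and are not drawn from the conference discussions.

```python
import math

# Minimal sketch of a quantitative microbiological risk assessment chain:
# exposure assessment (dose per serving) followed by an exponential
# dose-response model, P_ill = 1 - exp(-r * dose).
# All numbers are hypothetical placeholders.

def dose_per_serving(concentration_cfu_per_g, serving_size_g):
    """Exposure assessment: expected dose (CFU) ingested in one serving."""
    return concentration_cfu_per_g * serving_size_g

def prob_illness_exponential(dose_cfu, r):
    """Hazard characterisation: exponential dose-response model."""
    return 1.0 - math.exp(-r * dose_cfu)

if __name__ == "__main__":
    # Hypothetical inputs: 0.02 CFU/g in a 25 g serving, r = 0.001 per CFU.
    dose = dose_per_serving(0.02, 25.0)
    risk = prob_illness_exponential(dose, r=0.001)
    print(f"Dose per serving: {dose:.2f} CFU; probability of illness: {risk:.2e}")
```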

    Subjects, Topics, and Anchoring to the Context

    The article discusses the connection between the syntactic and semantic properties of weak, strong, and referential DP subjects. In particular, I argue that nominal expressions possess a situation argument and that their interpretation and distribution follow from the presuppositional requirements that the determiner imposes on the individual argument and the situation argument of its complement nominal. These presuppositional requirements, I then argue, are embodied by local relations of the subject to a distinct head in the C domain, Fin0 in the system of Rizzi (1997), where specific referential values of discourse antecedents are accessible.